
    Getting it right: further investigations on the impact of an online writing tool used by first year biology students

    An online writing tool that incorporates summative assessment was developed in 2007 and trialled in 2008 with first year biology students (Lilje, Breen, Lewis & Yalcin 2008a). The Online Report Writing Evaluation Tool (ORWET) was evaluated by both the staff, who use a staff version to help them mark student report writing, and the students, who use a student version to help them write a scientific report. Whilst the staff considered their tool useful for marking, and the analysis of marking indicated strong evidence of consistency across a large group of markers (Lilje, Breen, Lewis & Yalcin 2008b), the student story was very different. Although the students thought the tool was useful in learning to write a scientific report, they did not like using it (Lilje et al. 2008a). There was an avalanche of comments in response to open-ended questions about the strengths of the tool and how it could be improved. ORWET was originally designed for use in a second semester course, when, it has been suggested, students are better acclimatised to being university students. Unfortunately the course was moved to first semester (out of the control of the course developer - OL) without much consideration of the nature of the activities within the course or the readiness of students to be self-reliant and disciplined. Some of the recurring comments from students probably related to their transition from school to university, and these are being addressed. Version 2 of ORWET is in use in semester 1, 2009. This paper reports on a re-evaluation of the impact of the tool, investigating whether using it enhances students' understanding of what is required of them and whether it enhances their scientific writing skills.

    Teaching Human Biology to Large First Year Classes: an eLearning Journey for Students and Staff

    This paper reports on a journey that has seen the remodelling and redevelopment of a first year human biology course over a ten-year period. The students enrolled in the current course come from many different degree programs (about 15) and thus arrive with different expectations and aspirations. The course was first introduced in 1996 as a second semester course that assumed not only the knowledge from a prior first semester tertiary level biology course but also the benefit of a semester of tertiary study where the emphasis was student-centred rather than teacher-directed. It is currently a first semester course, and additional help is being added to support the transition from school to university. The move from a fully face-to-face to a blended learning environment started in 1998 with the replacement of one lecture a week with an independent student activity that could be done anywhere and at any time. Then, in response to student requests, more of the content of the curriculum was presented in the online environment, with an emphasis on blending the face-to-face activities with the online components and using these online components to direct the overall learning activities. Evidence that indicates we are providing a supportive blended learning environment is reported here.

    Use of traditional and eLearning components in a blended learning environment

    Structural changes to an advanced first year human biology course integrated an eLearning component with traditional lectures and laboratory classes. An investigation of use and perceptions of usefulness indicates that the components have been successfully blended. In 2005 the course underwent a major curriculum review, with a new structure emerging in which a significant amount of content was moved online. The online component became the focal point of the course, with the remaining face-to-face activities (lectures and practical classes) blended with the new online component in a manner that emphasised equal linkage between them. This was very different from the previous course, where online materials had been perceived by the students as supplemental rather than central, and where students had reported using them essentially for revision. Over a two-year period we have been investigating the use, and students' perceptions of usefulness, of all the resources available in this course. We wanted to see whether the way in which we had blended the learning materials changed the way in which students used them and whether their perceptions of usefulness had changed. As we believed the blended model offered the students better opportunities for deep learning, we also wanted to see whether their written responses to short answer examination questions improved as a result of the new course format.

    A pilot study on the impact of an online writing tool used by first year science students

    A recently developed Online Report Writing Evaluation Tool (ORWET) is a summative assessment tool that was introduced in 2008 into the first year Human Biology (HB) course in the School of Biological Sciences, The University of Sydney. ORWET aims to improve students’ understanding of scientific report writing. The tool presents sample scientific reports to students for marking, using the same criterion-based marking scheme provided to staff members when marking reports. The interactive environment of ORWET allows students to test their understanding of what makes a good scientific report. It also ensures students are made aware of the marking criteria and how reports are marked before producing their own scientific report. The reflective process is encouraged by the timely feedback provided by ORWET in response to students’ multiple marking attempts. ORWET has been integrated into the course structure as a summative assessment activity in an attempt to maximise students’ perception of the usefulness of the online component (Lilje, Krishnan and Peat 2007). The eLearning tool complements the traditional experimental and reporting assessment activity, thereby reinforcing the blended learning environment of the Human Biology course (Lilje and Peat 2006). This paper discusses students’ responses to ORWET and how it has impacted the overall standard of scientific writing submitted by the student cohort.

    The structure, use and impact of the staff version of ORWET

    Human Biology (HB) is a large junior course in the School of Biological Sciences, The University of Sydney, that employs 15-20 casual staff members to help teach in the laboratory classes and assist in the marking of summative assessment activities. There is usually considerable turnover of staff, which means a varying level of marking experience from year to year. A staff version of the Online Report Writing Evaluation Tool (ORWET) was created and used in 2007 and 2008 to increase staff awareness and interpretation of the marking criteria for one of the assessment activities, scientific report writing. ORWET complements strategies already in place to train markers by providing a flexible learning environment in which staff can practise marking worked examples of the report. ORWET provides detailed feedback for each sample report so that staff can compare their interpretation of the marking criteria with the desired standard set out in ORWET. The tool aims to increase markers’ confidence in marking and hence the quality of the feedback provided to students. Consistency in marking between multiple markers in a large course will also increase student confidence in the marking process. This paper describes the structure of the staff version of ORWET, its influence on consistent marking practices and the results of a staff evaluation of the tool.

    Challenges for students in the transition to communicating as biologists

    Context. Using the scientific literature and communicating scientific research findings are essential components of undergraduate degree programs (Brownall et al. 2013), and we have integrated academic writing and peer review into the biology curriculum since 1992. More recently the use of independent student inquiry activities has required us to focus on introducing students to the use of the primary literature (Healey and Jenkins 2009, Moscovitz and Kellogg 2011). First year students now engage with novel research experiments in laboratory courses, where they work with experimental design, data collection and interpretation, and reporting in the format of a journal article. As part of this process we expect them to access, read and incorporate information from the primary research literature into their report. Support for writing and reflection is incorporated through peer review sessions where students can give and receive feedback for further improvement of their writing.

    Problems. Students struggle with searching for, and reading, the relevant literature, extracting information from research articles and identifying key conclusions (van Lacum et al. 2012). Most students remain confused by this novel environment, and we predict that lack of engagement with the literature leads to poor outcomes in report writing. The concept of paraphrasing and its relationship with a conceptual understanding of the journal text also remains a challenge for most students (Pittam et al. 2009) and is reflected in ongoing issues around potential plagiarism. Meanwhile the peer review process leads to many misunderstandings, as students expect one-to-one teacher feedback and cannot develop the independence or confidence to help others while reflecting on their own work (Orsmond and Merry 2013, Nicol et al. 2014).

    Methodology and Results. Students in two first semester units of study (n=2500) were asked to provide access to their draft and final laboratory reports. Quantitative data included draft and final report marks, and the reports were subjected to a phenomenographic analysis to determine variation in the following items: a) extent and quality of text changes between draft and final reports, b) relationship between report marks, final grades and use of the primary literature, and c) use of the literature to develop scientific arguments. In addition, staff who teach in the units, and mark the drafts and reports, were interviewed to discuss key issues in the peer review and online marking processes. Preliminary results indicate that while a subset of students are able to identify appropriate research studies to cite in their own writing, can effectively paraphrase information and can develop arguments to place their experimental results in a broader research context, many students find it difficult to recognise appropriate papers to use in their writing, cannot identify the relevant information and therefore struggle with the writing process. The two groups approach the peer review process with different expectations, and their learning can be further impeded by an inability to reflect on their own writing and make subsequent improvements.

    Providing personalised targeted feedback in large first-year biology units

    Providing high quality, personalised feedback to first year students in large units of study can be problematic, as the process can be onerous in terms of staff time and cost, and inconsistent in quality due to large numbers of teaching staff. During the shift to online learning due to COVID-19, opportunities to provide feedback were even more limited, as the capacity to give immediate verbal feedback on student work in practicals was reduced. Here we describe how we created a feedback system that provided students with weekly, personalised feedback on their laboratory notebook entries, targeting tasks for which feedback would be most useful for future learning activities and assessment. This feedback was developed to be quick and efficient for practical demonstrators to mark, and thus able to be completed during online Zoom practicals, while maintaining rigour, consistency and detail in the feedback. We evaluate the system based on student feedback and discuss the feasibility of continuing to use this feedback system as practical classes return to face-to-face teaching.

    Who’s Zoomin’ Who? Student engagement during a faceless pandemic

    COVID-19 brought a faceless threat that impacted higher education, and society as a whole, in a way that had not been seen in our lifetimes: a nanoscopic threat to humanity and to the way we facilitate learning as passionate educators. The impacts on students, especially those of international status, were substantial. Isolation, enforced by the Federal Government, was as new to us all as the coronavirus was to our species. As a social species, we had no prior ‘immunity’ to the experience, and the scramble to adapt within the teaching and learning environment raised many difficulties for all participants. However, nowhere was this more adversely felt than in our Transitioning First Year Student (TFYS) cohort. Students, already having to adapt to the foreign experience of learning in a higher education landscape, were dealt the additional blow of doing so under Australian government-enforced isolation. Most, far from home and from their social networks, found themselves living without the social experience expected during their first year of university. A time when many form peer groups that last beyond higher education, and when they have the opportunity to ‘navigate their identity’ amongst peers, was taken from them. Regardless of the impact on grade distributions, the effects of isolation were seen in overt and distressing declines in mental health across the first-year cohort (pers. obs.). But what about those who did not feel they could, or did not know how to, reach out and engage? Here we show how collegiality, honesty and a peer-like approach to Coordination helped lessen this. However, despite these efforts, many students remained ‘invisible’, hiding behind the smokescreens of upheaval and workloads, and most evidently behind the black square of a faceless Zoom. We also explore alternative ways of fostering inclusion through the scaffolding of ‘social engagement’ amongst the students themselves.

    Engaging students in scenario-based assessment for final exams

    We present our approaches to enhancing the authenticity of final exams across large first-year, first semester biology units with cohort sizes between 300 and 1200 students. Historically, exams were used primarily as an instrument to assess knowledge retention, with limited provision of feedback to students. The necessity of shifting to online learning during the height of the COVID-19 pandemic provided us with a challenging yet opportune moment to transform our final examinations into an authentic learning experience for undergraduate biology students. We placed a large focus on integrating scenario-based questions into the final exam, thereby assessing students’ ability to apply knowledge to real-world contexts. To enhance engagement with the assessment, we also provided personalised feedback for each student. With additional challenges around access to artificial intelligence and academic integrity, we share our experiences returning to in-person final examinations and evaluate the relevance and benefits of scenario-based questions for student assessment and learning. We also share our approaches to feedforward initiatives that prepare students for examinations different from what most would have experienced in their secondary schooling.